Limitations of state estimation: absolute lower bound of minimum variance estimation/filtering, Gaussianity-whiteness measure (joint Shannon-Wiener entropy), and Gaussianing-whitening filter (maximum Gaussianity-whiteness measure principle)

Author

  • Song Fang
Abstract

This paper aims at obtaining performance limitations of state estimation in terms of variance minimization (minimum variance estimation and filtering) using information theory. Two new notions, the negentropy rate and the Gaussianity-whiteness measure (joint Shannon-Wiener entropy), are proposed to facilitate the analysis. Topics such as the Gaussianing-whitening filter (the maximum Gaussianity-whiteness measure principle) are also discussed.
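
As a rough, hedged illustration of the two ingredients named in the abstract (only the abstract is available here), the sketch below estimates negentropy, the entropy gap between a signal and a Gaussian of the same variance (a standard non-Gaussianity measure), and spectral flatness, the Wiener entropy of the power spectrum (a standard whiteness measure). The histogram entropy estimator, the segment-averaged periodogram, and all function names are illustrative assumptions; the paper's own Gaussianity-whiteness measure (the joint Shannon-Wiener entropy) is defined in the full text and is not reproduced here.

```python
import numpy as np

def differential_entropy_hist(x, bins=64):
    """Crude histogram estimate of differential entropy, in nats."""
    counts, edges = np.histogram(x, bins=bins)
    p = counts / counts.sum()
    width = np.diff(edges)
    mask = p > 0
    return -np.sum(p[mask] * np.log(p[mask] / width[mask]))

def negentropy(x, bins=64):
    """Entropy gap to a Gaussian of the same variance (>= 0, zero iff Gaussian)."""
    h_gauss = 0.5 * np.log(2 * np.pi * np.e * np.var(x))
    return h_gauss - differential_entropy_hist(x, bins)

def spectral_flatness(x, nseg=32):
    """Wiener entropy of a segment-averaged periodogram: geometric mean over
    arithmetic mean of the power spectrum, in (0, 1]; near 1 for white noise."""
    segs = x[: len(x) // nseg * nseg].reshape(nseg, -1)
    psd = np.mean(np.abs(np.fft.rfft(segs, axis=1)) ** 2, axis=0)[1:]  # drop DC
    return np.exp(np.mean(np.log(psd))) / np.mean(psd)

rng = np.random.default_rng(0)
white = rng.standard_normal(1 << 14)                        # Gaussian and white
laplace = rng.laplace(size=1 << 14)                         # non-Gaussian, white
coloured = np.convolve(white, np.ones(8) / 8, mode="same")  # Gaussian, coloured
print("negentropy:", negentropy(white), negentropy(laplace))      # ~0 vs > 0
print("flatness:  ", spectral_flatness(white), spectral_flatness(coloured))
```

Averaging the periodogram over segments before taking the geometric mean keeps the flatness of white noise close to 1; a raw single-shot periodogram would bias it downward.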


Similar resources

Three laws of feedback systems: entropy rate never decreases, generalized Bode integral, absolute lower bound in variance minimization, Gaussianity-whiteness measure (joint Shannon-Wiener entropy), Gaussianing-whitening control, and beyond

This paper aims at obtaining universal laws and absolute lower bounds of feedback systems using information theory. The feedback setup considered consists of causal plants and causal controllers. Three laws (entropy rate never decreases, generalized Bode integral, and absolute lower bound in variance minimization) are obtained, which are in the entropy domain, frequency domain, and time domain, respec...
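
Only this excerpt is shown, so the following is a numerical sanity check of the classical discrete-time Bode sensitivity integral that the second law generalizes, not a reproduction of the paper's result. The toy loop (a first-order plant with a single unstable pole at z = a and a proportional gain K) is an assumption made for the example; the check confirms that the average of ln|S| around the unit circle equals ln|a|, the contribution of the unstable open-loop pole.

```python
import numpy as np

# Toy loop: P(z) = 1/(z - a) with |a| > 1, proportional controller K,
# so S(z) = 1/(1 + K P(z)) = (z - a)/(z - a + K), closed-loop pole at a - K.
a, K = 1.5, 1.2                              # a - K = 0.3: closed loop is stable
N = 1 << 20                                  # frequency grid on the unit circle
z = np.exp(2j * np.pi * np.arange(N) / N)
S = (z - a) / (z - a + K)
bode_integral = np.mean(np.log(np.abs(S)))   # (1/2π) ∫ ln|S(e^{jω})| dω
print(bode_integral, "≈", np.log(abs(a)))    # both ≈ 0.4055 = ln|a|
```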

Minimum Mutual Information and Non-Gaussianity through the Maximum Entropy Method: Estimation from Finite Samples

The Minimum Mutual Information (MinMI) Principle provides the least committed, maximum-joint-entropy (ME) inferential law that is compatible with prescribed marginal distributions and empirical cross constraints. Here, we estimate MI bounds (the MinMI values) generated by constraining sets T_cr comprising m_cr linear and/or nonlinear joint expectations, computed from samples of N iid outcome...
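
As a hedged illustration of this finite-sample setting, the sketch below evaluates only the lowest rung of the MinMI hierarchy: with standard Gaussian marginals and the empirical correlation as the single cross constraint, the MinMI value is -(1/2) ln(1 - ρ²). The sample size, random seed, and the jointly Gaussian test pair are assumptions made for the example; they are not the constraint sets T_cr studied in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 5000                                       # finite sample of N iid pairs
x = rng.standard_normal(N)
y = 0.8 * x + 0.6 * rng.standard_normal(N)     # jointly Gaussian, true rho = 0.8

rho = np.corrcoef(x, y)[0, 1]                  # empirical cross constraint
minmi_gauss = -0.5 * np.log(1.0 - rho**2)      # Gaussian MinMI bound, in nats
true_mi = -0.5 * np.log(1.0 - 0.8**2)          # exact MI of this Gaussian pair
print(minmi_gauss, "vs", true_mi)              # close, up to sampling error
```

For the Gaussian pair used here the bound is tight up to sampling error; for non-Gaussian dependence satisfying the same constraints it is a genuine lower bound on the true mutual information.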

Finite sample effects of the fast ICA algorithm

Many algorithms for independent component analysis (ICA) and blind source separation (BSS) can be considered particular instances of a criterion based on the sum of two terms: C(Y), which expresses the decorrelation of the components and G(Y), which measures their non-Gaussianity. Within this framework, the popular FastICA algorithm can be regarded as a technique that keeps C(Y) = 0 by first ...
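
To make the C(Y)/G(Y) decomposition concrete, here is a minimal, self-contained sketch of the FastICA idea on an assumed two-source toy mixture (not the configuration analysed in the paper): whitening enforces C(Y) = 0 once, and the symmetric fixed-point iterations then increase the non-Gaussianity term under an orthogonality constraint. The mixing matrix, the tanh nonlinearity, and the iteration count are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 20000
S = np.vstack([rng.uniform(-1, 1, n), rng.laplace(size=n)])  # non-Gaussian sources
X = np.array([[1.0, 0.6], [0.4, 1.0]]) @ S                   # observed mixtures

# Whitening: afterwards the components are exactly decorrelated, so the
# C(Y) = 0 part of the criterion is enforced once and for all.
Xc = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(Xc))
Z = (E / np.sqrt(d)) @ E.T @ Xc              # Z = Cov^(-1/2) (X - mean)

# Symmetric FastICA: fixed-point updates raise the non-Gaussianity contrast
# (tanh nonlinearity), and the SVD step keeps the demixing matrix orthogonal.
W = np.linalg.qr(rng.standard_normal((2, 2)))[0]
for _ in range(200):
    Y = W @ Z
    g, gp = np.tanh(Y), 1.0 - np.tanh(Y) ** 2
    W = (g @ Z.T) / n - gp.mean(axis=1, keepdims=True) * W
    U, _, Vt = np.linalg.svd(W)
    W = U @ Vt                               # symmetric decorrelation

C = np.corrcoef(np.vstack([W @ Z, S]))[:2, 2:]
print(np.round(np.abs(C), 2))                # close to a permutation matrix
```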

Minimum Mutual Information and Non-Gaussianity Through the Maximum Entropy Method: Theory and Properties

The application of the Maximum Entropy (ME) principle leads to a minimum of the Mutual Information (MI), I(X,Y), between random variables X,Y, which is compatible with prescribed joint expectations and given ME marginal distributions. A sequence of sets of joint constraints leads to a hierarchy of lower MI bounds increasingly approaching the true MI. In particular, using standard bivariate Gaus...
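
For reference, the first member of that hierarchy can be written down explicitly; this is a standard identity rather than a result taken from the paper, and the symbol I_G is introduced here only to denote the Gaussian bound:

```latex
% With standard Gaussian marginals and the single cross constraint
% corr(X, Y) = \rho, the maximum-entropy joint law is bivariate Gaussian, so
\[
  I(X, Y) \;\ge\; I_G(X, Y) \;=\; -\tfrac{1}{2}\,\ln\!\left(1 - \rho^{2}\right),
\]
% and each additional joint expectation constraint can only raise this lower
% bound, moving it toward the true mutual information I(X, Y).
```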

Aalborg Universitet: Non-Gaussian, Non-stationary and Nonlinear Signal Processing Methods - with Applications to Speech Processing and Channel Estimation

The Gaussian statistical model, despite its mathematical elegance, is found to be too factitious for many real-world signals, as manifested by its unsatisfactory performance when applied to non-Gaussian signals. Traditional non-Gaussian signal processing techniques, on the other hand, are usually associated with high complexity and low data efficiency. This thesis addresses the problem of opt...


Journal title:

Volume   Issue

Pages  -

Publication date: 2014